Delayed exponential fitting by best tensor rank-(R1, R2, R3) approximation
Authors
Abstract
We present a subspace-based scheme for estimating the poles (angular frequencies and damping factors) of a sum of damped and delayed sinusoids. In our model, each component is supported on a different time frame, depending on its delay parameter. Classical subspace-based methods are not suited to handling signals with varying time supports. In this contribution, we propose a solution based on the best rank-(R1, R2, R3) approximation of a partially structured Hankel tensor onto which the data are mapped. We show, by means of an example, that our approach outperforms current tensor- and matrix-based approaches in terms of the accuracy of the damping-parameter estimates.
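A minimal, self-contained sketch of the generic building blocks named in the abstract: Hankel tensorization of a sampled signal, a best rank-(R1, R2, R3) approximation computed with HOOI (higher-order orthogonal iteration), and an ESPRIT-style shift-invariance step that reads the poles off the dominant mode-1 subspace. It is not the paper's algorithm: the partially structured Hankel tensor that accounts for the delays is not implemented here, and the signal, tensor dimensions and ranks are illustrative assumptions (Python/NumPy).

import numpy as np

def unfold(T, mode):
    # Mode-n unfolding: move axis `mode` to the front and flatten the rest.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_mult(T, A, mode):
    # Mode-n product T x_n A: A acts on the `mode`-th axis of T.
    return np.moveaxis(np.tensordot(A, T, axes=(1, mode)), 0, mode)

def hankel_tensor(y, I, J, K):
    # Third-order Hankel tensor T[i, j, k] = y[i + j + k]; needs len(y) >= I + J + K - 2.
    i, j, k = np.ogrid[:I, :J, :K]
    return y[i + j + k]

def hooi(T, ranks, n_iter=50):
    # Best rank-(R1, R2, R3) approximation via higher-order orthogonal iteration,
    # initialized with the truncated HOSVD of the mode-n unfoldings.
    U = [np.linalg.svd(unfold(T, n), full_matrices=False)[0][:, :r]
         for n, r in enumerate(ranks)]
    for _ in range(n_iter):
        for n in range(T.ndim):
            S = T
            for m in range(T.ndim):
                if m != n:                      # project on the other factors
                    S = mode_mult(S, U[m].conj().T, m)
            U[n] = np.linalg.svd(unfold(S, n), full_matrices=False)[0][:, :ranks[n]]
    return U

# Illustrative data: two damped complex exponentials, no delays, light noise.
rng = np.random.default_rng(0)
t = np.arange(60)
z_true = np.array([np.exp(-0.02 + 1j * 0.8), np.exp(-0.05 + 1j * 1.9)])
y = (z_true[None, :] ** t[:, None]).sum(axis=1) + 0.01 * rng.standard_normal(60)

T = hankel_tensor(y, 20, 20, 20)
U1, _, _ = hooi(T, ranks=(2, 2, 2))

# ESPRIT-style shift invariance on the dominant mode-1 subspace.
Phi = np.linalg.lstsq(U1[:-1], U1[1:], rcond=None)[0]
z_hat = np.linalg.eigvals(Phi)
print("damping estimates  :", np.sort(-np.log(np.abs(z_hat))))
print("frequency estimates:", np.sort(np.angle(z_hat)))

With noiseless data the eigenvalues of Phi coincide with the signal poles; the rank-(R1, R2, R3) truncation plays the role that SVD truncation plays in matrix subspace methods, and the partially structured Hankel tensor described in the abstract is what adapts this machinery to components with different time supports.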
Similar resources
A Newton-Grassmann Method for Computing the Best Multilinear Rank-(r1, r2, r3) Approximation of a Tensor
We derive a Newton method for computing the best rank-(r1, r2, r3) approximation of a given J × K × L tensor A. The problem is formulated as an approximation problem on a product of Grassmann manifolds. Incorporating the manifold structure into Newton’s method ensures that all iterates generated by the algorithm are points on the Grassmann manifolds. We also introduce a consistent notation for ...
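For reference, a standard way to pose this best rank-(r1, r2, r3) problem (a textbook formulation, not quoted from the paper above) is as a maximization over orthonormal factor matrices, since minimizing the approximation error is equivalent to maximizing the energy of the projected core:

\max_{U_1,\,U_2,\,U_3}\ \left\| \mathcal{A} \times_1 U_1^{\mathsf{T}} \times_2 U_2^{\mathsf{T}} \times_3 U_3^{\mathsf{T}} \right\|_F^2
\quad \text{s.t.}\quad U_1^{\mathsf{T}}U_1 = I_{r_1},\ U_2^{\mathsf{T}}U_2 = I_{r_2},\ U_3^{\mathsf{T}}U_3 = I_{r_3},

with U_1 of size J x r_1, U_2 of size K x r_2 and U_3 of size L x r_3. The objective is unchanged when any U_n is replaced by U_n Q_n with Q_n orthogonal, so only the column spaces of the factors matter; this invariance is what makes the product of Grassmann manifolds the natural search space.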
Cross: Efficient Low-rank Tensor Completion
The completion of tensors, or high-order arrays, attracts significant attention in recent research. Current literature on tensor completion primarily focuses on recovery from a set of uniformly randomly measured entries, and the required number of measurements to achieve recovery is not guaranteed to be optimal. In addition, the implementation of some previous methods is NP-hard. In this artic...
On the Best Rank-1 and Rank-(R1, R2, ..., RN) Approximation of Higher-Order Tensors
In this paper we discuss a multilinear generalization of the best rank-R approximation problem for matrices, namely, the approximation of a given higher-order tensor, in an optimal least-squares sense, by a tensor that has prespecified column rank value, row rank value, etc. For matrices, the solution is conceptually obtained by truncation of the singular value decomposition (SVD); however, this...
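A minimal sketch (assumed NumPy, not code from the paper) of the multilinear analogue of SVD truncation mentioned above: keep the leading r_n left singular vectors of each mode-n unfolding. Unlike the matrix case, the resulting truncated HOSVD is a good but in general not optimal rank-(r1, ..., rN) approximation, which is why it is usually refined iteratively (for example by the HOOI loop sketched earlier).

import numpy as np

def truncated_hosvd(T, ranks):
    # Factor matrices: leading left singular vectors of each mode-n unfolding.
    U = []
    for n, r in enumerate(ranks):
        Tn = np.moveaxis(T, n, 0).reshape(T.shape[n], -1)
        U.append(np.linalg.svd(Tn, full_matrices=False)[0][:, :r])
    # Core tensor: project T on all factor matrices.
    G = T
    for n, Un in enumerate(U):
        G = np.moveaxis(np.tensordot(Un.conj().T, G, axes=(1, n)), 0, n)
    return G, U   # approximation: G x_1 U[0] x_2 U[1] ... x_N U[N-1]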
A Grassmann-Rayleigh Quotient Iteration for Dimensionality Reduction in ICA
We derive a Grassmann-Rayleigh Quotient Iteration for the computation of the best rank-(R1, R2, R3) approximation of higher-order tensors. We present some variants that allow for a very efficient estimation of the signal subspace in ICA schemes without prewhitening.
Low-rank tensor completion: a Riemannian manifold preconditioning approach (supplementary material)
A. Proof and derivation of manifold-related ingredients. The concrete computations of the optimization-related ingredients presented in the paper are discussed below. The total space is M := St(r1, n1) × St(r2, n2) × St(r3, n3) × R^(r1 × r2 × r3). Each element x ∈ M has the matrix representation (U1, U2, U3, G). Invariance of the Tucker decomposition under the transformation (U1, U2, U3, G) ↦ (U1 O1, U2 O2, U3 O3, G ×...
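The line above is cut off, but the property it refers to is the standard non-uniqueness of the Tucker format (stated here for reference, not quoted from the supplementary material): for orthogonal matrices O_1, O_2, O_3, the transformation

(U_1, U_2, U_3, \mathcal{G}) \ \mapsto\ \left(U_1 O_1,\ U_2 O_2,\ U_3 O_3,\ \mathcal{G} \times_1 O_1^{\mathsf{T}} \times_2 O_2^{\mathsf{T}} \times_3 O_3^{\mathsf{T}}\right)

leaves the reconstructed tensor \mathcal{G} \times_1 U_1 \times_2 U_2 \times_3 U_3 unchanged, which is the usual motivation for working on a quotient of the total space M in Riemannian approaches of this kind.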